CCNEWS3.3
Release Notes – WDLS 3.3
Table of Contents
Account Management and Metrics
Financial Reporting, Invoicing, Accounts Receivable
Featured Enhancements
Driver Signature Capture
All forms of core bills produced by WDLS - Standard Bills, Master Bills, VICS Bills - can be set up to be produced as PDF files. As with other PDF output produced by WDLS, these are placed in a subdirectory within a tree structure under the published reporting root and the account. Logic was introduced that instead places the PDF files in a pending location and inserts a signature block in the document. An option can then be taken from the work with orders or dock appointment panels that displays the bill on the shipping clerk's PC, and a Topaz signature pad attached to that PC can be used by the driver to digitally sign the bill. The signed bill can then be printed to provide a driver's copy and saved. Once signed, the logic relocates the bill to the publication directory.
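The relocation step can be pictured with a short sketch - Python purely for illustration, and the directory names below are hypothetical stand-ins for the configured reporting root:

```python
import shutil
from pathlib import Path

# Hypothetical locations; the actual pending and publication roots are
# configured per installation and account in WDLS.
PENDING_ROOT = Path("/reports/pending")
PUBLISH_ROOT = Path("/reports/published")

def publish_signed_bill(account: str, bill_name: str) -> Path:
    """Move a signed bill PDF from the pending area into the account's
    publication directory, creating the target tree if needed."""
    src = PENDING_ROOT / account / bill_name
    dest_dir = PUBLISH_ROOT / account
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / bill_name
    shutil.move(str(src), str(dest))
    return dest
```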
API Middleware Tooling
We are encountering more and more requirements to interface with customers and other systems using socket communications over internet connections. These exchanges can be cumbersome to establish, and each organization is unique in how it operates and in its Application Programming Interfaces (APIs). We have developed tooling that streamlines the process of making these connections.
Once the connections are made and the communications pipe is established, the data mapping must be handled. We have had basic functions in place for several years to parse and assemble Extensible Markup Language (XML), but that standard has evolved and there is new complexity. In addition, JavaScript Object Notation (JSON) has become a popular notation. IBM has introduced capabilities in the database to read and write these formats. We built tooling to tap into this new capability and extended our message definition capability to better accommodate the schemas used.
Voice Implementation
Interfacing logic was built to communicate with the TopVox Voice application. The interface was built to perform transaction-based dialogs to accomplish picks, batch picks, replenishments, and count backs. An associate using a voice unit operates in a very similar fashion to an RF operator. The setup is basically identical, and an operator can switch between an RF and voice unit. Dialogs are constructed in a table and maintenance is available to tweak voice prompts and messages.
Inventory Management
ABC Throughput Analysis Extract
A new form of the throughput analysis report was added that allows extraction of item volumes and turn percentages. The report is produced in extract form as a CSV file, which is emailed and can be used for further analysis and layout planning. Fields added to the throughput data include percentage of inventory turned, activity date range, current pick line, zone, and product code.
Expanded Chemical Names
Some of these chemical names are getting out of hand. Sixty-character names have proven to be insufficient. A second 60-character field was added so that proper descriptions for paperwork could be constructed. This allows the proper name "ACRYLIC ACID, SODIUM 2-ACRYLAMIDO-2-METHYL-1-PROPANESULFONATE AND SODIUM PHOSPHINATE" to be correctly reflected on the bill of lading.
Drum Processing Enhancements
The drum logic was revisited. Drums are a form of non-mandatory serialization: drum activity can be captured both inbound and outbound, providing a way to track only the items that require drum recording. Activity histories and auditing information enrich visibility. Note that drums, unlike serial numbers or license plates, are not located and are not a valid inventory model. They are simply used to communicate the specific numbers associated with an inbound or outbound transaction.
Item Based Order Cleanup
WDLS provides a good deal of operational flexibility. Due to lot granularity, it is common practice to mix lots within a warehouse location, especially in a forward pick location. There is a setting that directs a picker to the appropriate location but allows them to pick any lot. This means stock commitment may have assumed one lot while the pick took another. The existing logic cleaned up the order details, recorded the correct lot, and de-committed the lot that was not taken, but only once the order was completely picked. In the interim, other orders might commit to the lot that was taken instead of the lot still on hand. The cleanup process was re-written so that cleanup is done on each order line as soon as the line is picked. This improves stock accuracy so that future commitments make the best choices according to the business rules.
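A minimal sketch of the line-level reconciliation, assuming simplified in-memory records (the real logic works against the WDLS order detail and commitment tables):

```python
def clean_up_line(line: dict, committed_qty: dict) -> None:
    """Reconcile one picked order line: record the lot actually taken and
    release the commitment against the lot that was assumed."""
    assumed, picked, qty = line["committed_lot"], line["picked_lot"], line["qty"]
    if picked == assumed:
        return  # the commitment already matches what was picked
    committed_qty[assumed] = committed_qty.get(assumed, 0) - qty  # free the assumed lot
    committed_qty[picked] = committed_qty.get(picked, 0) + qty    # commit the lot taken
    line["committed_lot"] = picked                                # correct the detail
```

Running this as each line is picked, rather than at order completion, is what closes the window in which other orders could commit against the wrong lot.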
Lot Specific Shelf Life
Yes, it happened. In addition to all the granularity we have been seeing, we are encountering customers for whom shelf life can no longer be defined as an item attribute: on a lot-by-lot basis, the shelf life can vary. A lot status file was added to the system to track this. A lot status holding file was also introduced to allow the lot specifics to be mapped via 943 processing. A web portal program and an inbound 947 process were introduced that allow customers to supply lot specifics or to make stock condition changes.
Lot Validity and Substitution
Lot codes are very often not communicated on ASN's and are often determined only after the product arrives. Originally, when product arrived, the paperwork for the receipt made it into the office, and the lot codes from the paperwork or a system review could be used to confirm the lots were correct and properly entered before the receipt was confirmed. When a lot was improperly recorded, someone had to go back out to the warehouse to verify and correct it. In the case where a date is being extracted from the lot, standard logic can be executed to verify it is a proper date. Lot verification was added and is executed during any initial lot capture process to help ensure a valid lot code was captured. There is also an exit point that can run custom logic for an account if some non-traditional logic is required for verification.
Via this exit, we can also perform transformations on the lot that is captured from the packaging. Some customers use bar codes or printed representations that contain control characters or other debris that must be removed or transformed when the lot is recorded, so the lot can be communicated back to the customer cleanly on their electronic documents.
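As an illustration, a scrub-and-verify pass might look like the sketch below; the scrubbing rule and the embedded-date format are assumptions, since both vary by account:

```python
import re
from datetime import datetime

def scrub_lot(raw: str) -> str:
    """Strip control characters and other non-printable debris captured
    from the packaging bar code before the lot is recorded."""
    return re.sub(r"[^\x20-\x7E]", "", raw).strip()

def lot_date_is_valid(lot: str) -> bool:
    """Where a date is embedded in the lot (assumed here: 2-digit year
    plus Julian day in the first five characters), verify it parses."""
    try:
        datetime.strptime(lot[:5], "%y%j")
        return True
    except ValueError:
        return False
```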
Standardize Approach to 947 Reason Filtering
The 947 standard EDI document is used to communicate an adjustment to inventory back to a customer. Customers vary greatly in what adjustments they want to see. Stock condition changes, location moves, and special processing via value added services affect what the customer wants to see or not see. This was leading to a proliferation of custom code. A more standardized approach was taken, and reason codes can now be passed into the triggering logic to filter those adjustments that should be communicated.
Addition of MOVELP and HMOVELP
For LP-based accounts, serial numbers are scanned during picking processes, and serial activities are posted if everything goes according to plan and communications are not dropped part way through the process. A file (MOVELP) was added to the system to track from and to LP numbers during a move, with a history counterpart (HMOVELP) recording the serials involved for audit purposes. This eliminates the loss of serial information when there is a communication failure and provides better audit information for research.
Account Management and Metrics
Metrics Repository and Constraints
Real-time metrics are becoming popular in many of our customer facilities. Monitors are placed in public locations and a series of panels are displayed to show the volume of work in the schedule and the current progress on that work. Some of these metrics can require significant data extraction and number crunching to calculate. The metrics display programs that show these panels are written to pull these values from a metric repository. Specific metric values can then be given a freshness period. When a metric is retrieved, the freshness timestamp is checked. If the value is within its freshness period, no further action is taken. If a metric is past the freshness period, the job that refreshes the metric is submitted so it can be recalculated for the next retrieval.
Metric extractors are written to accept a constraint set as a parameter. The extractor logic calculates and stores one or more values, such as the number of orders scheduled to ship and the number that are complete. If no constraint set is designated, the extractor calculates these values across the organization. Using a constraint set, you can limit the result to a specific building, account, or type of shipment, such as LTL shipments. This allows you to tune a specific display to a building or an area in the facility.
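A rough sketch of the retrieval flow, combining the freshness check with constraint-set keying (the repository layout, freshness period, and job submission below are illustrative assumptions):

```python
import time
from dataclasses import dataclass

@dataclass
class Metric:
    value: float = 0.0
    refreshed_at: float = 0.0   # epoch seconds of the last refresh
    freshness_secs: int = 300   # configured freshness period

# Repository keyed by (metric name, constraint set).
repository: dict = {}

def submit_refresh(name: str, constraint_set: str) -> None:
    """Stand-in for submitting the extractor job with its constraint set."""
    ...

def get_metric(name: str, constraint_set: str = "*ALL") -> float:
    """Return the stored value; if it is past its freshness period, kick
    off a refresh so the next retrieval sees a recalculated value."""
    m = repository.setdefault((name, constraint_set), Metric())
    if time.time() - m.refreshed_at > m.freshness_secs:
        submit_refresh(name, constraint_set)
    return m.value
```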
Activity Performance Metrics
Metric extractors were written for both inbound and outbound transactions that each produce several metrics for the supplied from and to timestamps. The total volume and values for pending, active, and completed quantities for appointments, transactions, lines, pallets, cases, and weights are available. A building-level display program was introduced to show these values on a monitor. The HTML used to format the display is passed in as a parameter. Using a combination of constraint sets and HTML formats, a variety of panels can be presented by the same basic logic. Thus, a panel that shows three pie charts of inbounds - receipts, lines, and pallets - and a panel that shows outbound parcel shipments in a histogram can be produced without additional programming.
Metrics Scripting
Metrics scripting allows definition of a never-ending script that plays out on a metric display. Functions define a call to a metrics display program and the parameters required for the particulars of the display. A sequence establishes the order of presentation. Once a script is set, a user profile can be established to run it by directing the profile's initial program to the script processor and identifying the script to display. The metrics panel is configured to launch the browser as the initial program and supplies the user profile. When the monitor is powered up, it launches the script and starts the display.
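Conceptually, the script processor is a never-ending loop over the defined display calls. A minimal sketch with hypothetical program names and parameters:

```python
import itertools
import time

def show_panel(program: str, params: dict) -> None:
    """Stand-in for invoking one metrics display program."""
    print(f"displaying {program} with {params}")

# Hypothetical script entries: (display program, parameters, seconds to show).
script = [
    ("INBOUND_PANEL", {"constraints": "BLDG1"}, 30),
    ("OUTBOUND_PANEL", {"constraints": "BLDG1", "format": "HISTOGRAM"}, 30),
]

for program, params, seconds in itertools.cycle(script):  # never-ending playback
    show_panel(program, params)
    time.sleep(seconds)
```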
Inbound / Outbound Activity Metrics
Metrics panels were developed that can give a more comprehensive view of inbound and outbound activity. These panels accept parameters that not only identify the constraint set to use, but also provide secondary or even tertiary constraints that are used to produce cross-tabulated data. Thus, not only can you show the counts and volumes of inbound shipments for the day, but you can also show them broken down by load type - palletized, floor loaded, slip sheet - and by appointment type - live unload, drop trailer. Like the building metrics, the HTML format is passed in so that you can have several options for how to present the metrics.
Labor Activity Reporting
Labor activity reporting was enhanced to capture each operator's time allocated to each account they work in. The traditional time reporting using move history focused on transactional time (assigned to complete), which was not reflective of the actual time the operator spends performing a transaction. Access to labor planning history allows logic to calculate a more accurate duration for these activities. Using this information, three enhancements were made.
First, logic was added to record the actual "start" time of the transaction in addition to the assigned time. This allows WDLS to track the time from assignment to completion as well as from the actual start time (the time the operator started the transaction) to the completion time. The second change was to support recording any non-transactional time (indirect time) an operator may have, such as meetings, forklift PM, and warehouse cleaning. Both changes were originally introduced in WDLS 3.2.
The third change, included in this release, creates several extracts and reports that take the true transaction time (start to completion) and add in the non-transactional time between activities. This "gap" time is applied to the previous transaction. For example: you complete 5 picks on an order that each took 60 seconds. After the fifth pick you drive to the staging area, wrap the pallet, apply labels/placards, and then place the pallet down. If that takes an additional 5 minutes, the activity reporting will apply it to the order you just completed. This gives your operations teams a much better idea of what the handling costs are for each account and can provide a more accurate picking metric.
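The gap attribution can be checked with a small worked example mirroring the five-pick scenario above (timestamps and record shapes are hypothetical):

```python
from datetime import datetime

FMT = "%H:%M:%S"
# (order, start, complete), in completion sequence for one operator.
log = [
    ("ORD1", "08:00:00", "08:05:00"),  # five 60-second picks
    ("ORD2", "08:10:00", "08:12:00"),  # started 5 minutes after ORD1 completed
]

def true_time_with_gap(entries):
    """Start-to-completion duration per order, plus the idle gap before the
    next transaction, attributed back to the order just completed."""
    out = []
    for i, (order, start, done) in enumerate(entries):
        s, d = datetime.strptime(start, FMT), datetime.strptime(done, FMT)
        worked = (d - s).total_seconds()
        gap = 0.0
        if i + 1 < len(entries):
            gap = (datetime.strptime(entries[i + 1][1], FMT) - d).total_seconds()
        out.append((order, worked + gap))
    return out

print(true_time_with_gap(log))  # [('ORD1', 600.0), ('ORD2', 120.0)]
```

ORD1 is charged its 300 seconds of picking plus the 300 seconds spent staging and wrapping, matching the narrative above.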
Labor Productivity Metrics
A labor metrics panel was developed that shows all active operators and their performance against the published standards. Performance standards can be recorded by activity and can vary by account or building.
Account Setup Import Enhancements
Several improvements were made to the opening balance import. All inventory models are supported. LP's and lot codes are created, as are locator records. Standard logic to set lot FIFO sequencing is used. Lot unit weights can now be imported. A comment can be supplied for the opening adjustment of each item.
In addition to more extensive opening balance imports, several master file and peripheral data imports were introduced. Consignees, consignee items, item cross references, item preferred locations, pick line locations, and shelf life are all supported by the import processes. Item attributes, which are sometimes used for special setup needs, can now be imported. Map imports were expanded to support location cross reference/check phrase import. Imports were also added to help set up labor plans - labor groups, plans, and users.
Expanded Data Extraction Capabilities
WDLS has always had good data definition that supports queries to provide ad hoc reporting capabilities. When a query is directed to an output file, a Codeworks utility can be used to convert it to a comma-separated values (CSV) file and email it. Extensions to the extract utility included in the last release allow extraction to an XML or HTML format. With proper setup, a cascading style sheet (CSS) can be referenced in the headers of these documents, which allows you to brand them and include logos. You can also include specific copyright or other legal language at the bottom of these extracts.
In addition to these more generic capabilities, most of the system generated reporting is now capable of producing an extract file. Specific extract processes for reporting and managing exceptions and retrieving transaction activities were consolidated. See the financial topic for more detail on extracts for invoicing and receivables automation.
Master Bill Exit Point
Bills of lading, warehouse receipts, packing lists, and pick sheets have been driven by exit points for some time. Master bills had little variety and were handled with multiple options from the Work With Loads panel. A new exit point (MASTERBOL) was defined that is now used by processing logic to determine the master bill program to call. This eliminates the proliferation of custom options and makes things more consistent with other processes.
Gift Cards
As more fulfillment accounts have been added, the need for a standard gift card process emerged. A new type of order comment was introduced and is supported by the 940 processes. The GIFTCARD exit point was added to name the logic that produces the cards. Options were added to work with orders and work with batches for card production.
Many More Purges
Data volumes are becoming an issue. As we have added more and more capabilities to the system, there have been several additional tables and directories established. The major purge processes have been kept current with new files, but there is now data building up that is far more temporary in nature. The FTP purge processes were revisited, and all temporary interface files are addressed. As more customers move to API interfaces, purges for work directories have been added. A utility to purge any directory by file date was added to help manage publishing directories and extracts. Purges for the dock scheduler and user function logging were added.
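The purge-any-directory-by-file-date utility can be pictured with this sketch (Python for illustration; the dry-run flag is an assumption, added here so the selection can be reviewed before anything is deleted):

```python
import time
from pathlib import Path

def purge_directory(path: str, days: int, dry_run: bool = True) -> list:
    """Delete (or, when dry_run, merely list) files under 'path' whose
    modification time is older than the given number of days."""
    cutoff = time.time() - days * 86400
    purged = []
    for f in Path(path).rglob("*"):
        if f.is_file() and f.stat().st_mtime < cutoff:
            purged.append(str(f))
            if not dry_run:
                f.unlink()
    return purged
```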
Mining The Move History
There is a tremendous amount of information captured and placed in the move history. It can be mined for several purposes, including activity volumes, facility performance, associate performance, inventory accuracy, redirection activities, cycle count accuracy, and more. Numerous extracts were added to the release to capture and structure this information.
Dock Scheduler
Dock Schedule Performance Metrics Panel
Appointment-based metrics were introduced in this release. The metrics are included in the inbound and outbound extractors. Pending, active, and completed values are available. Average appointment time and average unload/load-out time are also calculated. Filtering is supported by building, account, and appointment type (live load/unload or drop trailer). When account filtering is used, if any shipment on the appointment is for the filtered account, the appointment is counted.
Dock Appointment Performance Extract
An extract was created to track appointments in which the driver was NOT late yet detention exceeded the allowed time. Parameters allow you to specify a buffer that determines when the driver is late and the duration used to decide whether the detention period was exceeded. It is used to monitor whether the warehouse is moving drivers in and out within the expected time frames.
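The selection rule reduces to two comparisons, sketched below with the buffer and allowed detention shown as hypothetical defaults:

```python
from datetime import datetime, timedelta

def flag_appointment(scheduled: datetime, arrived: datetime, departed: datetime,
                     late_buffer_min: int = 15, allowed_min: int = 120) -> bool:
    """True when the driver arrived on time (within the late buffer) yet
    detention exceeded the allowed duration - the appointments this
    extract is designed to surface."""
    on_time = arrived <= scheduled + timedelta(minutes=late_buffer_min)
    over_detained = departed - arrived > timedelta(minutes=allowed_min)
    return on_time and over_detained
```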
Dock Appointment Based Electronic Paperwork
Options and logic were added to create, display, and capture signatures at the appointment level to streamline shipping office procedures.
Appointment Based Auto-Receive
Typical receiving verifies each item, captures lots and serial numbers, and confirms the location where each item was placed in the warehouse. In fast-moving distribution operations and contract operations, dock bays are at a premium. Suppliers' ASN's are trusted and accurate, and ownership and responsibility for inventory discrepancies are often not as onerous as in normal 3PL relationships. A quick auto-receiving process was developed for these operations. An option was created to capture a location for an appointment. The logic goes to each tally on that appointment, writes the put-away location onto every detail line for those tallies, and confirms them as received complete.
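In outline, the auto-receive pass looks like this sketch (hypothetical record shapes; the real logic updates the WDLS tally files):

```python
def auto_receive(appointment: dict, putaway_location: str) -> None:
    """Stamp the one captured put-away location onto every detail line of
    every tally on the appointment, then confirm each tally as received."""
    for tally in appointment["tallies"]:
        for line in tally["details"]:
            line["location"] = putaway_location
            line["qty_received"] = line["qty_expected"]  # the ASN is trusted
        tally["status"] = "RECEIVED COMPLETE"
```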
Electronic Commerce
Comments, Detail Added to 810 Processing
The 810 document is used to deliver invoicing via EDI. Considerable updating has been done to the 810 process to convey detail on all forms of invoicing - accessorial, receiving, summary, storage, etc. The process was updated in this release to also supply invoice comments.
940 Order Processing Enhancements
New order processing was getting difficult to manage due to the number of order sources and processes. Changes were introduced to the processing logic to add more control and avoid collisions among the multitude of sources. Searches over the holding files, better cleanup, and additional tooling were introduced.
Standard Tally, Order, and Requisition Imports
When a customer does not have sufficient expertise for more traditional electronic data exchange, they may be able to email a spreadsheet with their orders, ASN's, or requisition requests. Standard utilities were added to WDLS to import this data into holding files for verification and processing. The spreadsheet is opened, the columns to map are identified, and the data is saved to the integrated file system (IFS) in CSV format. Documentation was added explaining the import process and identifying all the fields that can be mapped.
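A sketch of the mapping step, using the third-party openpyxl package as a stand-in for the WDLS utility; the column map is hypothetical:

```python
import csv
from openpyxl import load_workbook  # third-party; stand-in for the WDLS tooling

# Hypothetical mapping: spreadsheet column letter -> holding-file field.
COLUMN_MAP = {"A": "po_number", "B": "item", "C": "qty"}

def spreadsheet_to_csv(xlsx_path: str, csv_path: str) -> None:
    """Open the customer's spreadsheet, pull the mapped columns, and save a
    CSV to the IFS path for the holding-file import to pick up."""
    ws = load_workbook(xlsx_path, read_only=True).active
    indexes = [ord(c) - ord("A") for c in COLUMN_MAP]          # 0-based columns
    with open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(COLUMN_MAP.values())                   # header row
        for row in ws.iter_rows(min_row=2, values_only=True):  # skip headings
            writer.writerow([row[i] for i in indexes])
```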
XML FTP 940, 945, 943, 944, 947, and 832
XML formats and processing for most of the standard EDI inbound and outbound processes were included in the release.
Tooling for Stream Formatted Data
Data being delivered via API's arrives in streams as opposed to database tables. Tooling and utilities were developed to allow similar access to view, re-trigger processing, and provide audit trails for support and operations associates.
New Order Notification
A new order notification process was added to the release. When orders are processed in through the holding files, a summary list of the orders that arrived can be delivered to a distribution list.
Functional Acknowledgment Tools
Specific tooling was put in place to review and re-trigger functional acknowledgments. While functional acknowledgments are handled transparently for traditional X12 relationships in the Extol software, they are triggered using programming logic in API relationships, and recovering from a hiccup in these exchanges can vary quite a bit. It was decided to treat the 997 much like our other outbound documents: write a trigger for it and use the existing trigger work-with panel and options to re-trigger and resend, helping with recovery.
Warehouse Operations
Enhanced Batching Capabilities
As more and more orders are shipped as parcels, the need for improvements to order selection for batching surfaced. Batching was originally controlled by a carrier limits file that used cutoff thresholds for quantity, order count, and weight to organize the orders into batches. With the much higher order counts being processed in fulfillment operations, a lot more intelligence in order batching was required. Single-line orders for a product promotion, optimizing batches to a pallet's worth of picking, and grouping by destination have all been introduced as business requirements.
Coming of Age of Batching and Batch Processing
Batch picking was introduced to reduce travel times and de-clutter traffic within the aisles of the warehouse. RF and voice picking processes were added to pick a batch of orders in bulk and transfer the product to a staging area for final sorting into orders for verification and labeling.
Order Verification
An order verification process was added that controls and records verification that the order was properly picked and is complete. The controls set the amount of verification to perform. A set of order verification tables were introduced to log the verification information for audit purposes. At the header level the start and finish times for the verification are recorded along with the verifications that were performed. At the detail level individual item scans are recorded. Should an order fail verification, it can be re-verified or directed to a hospital line. Once any discrepancy is resolved, the order can be taken back through verification. A verification report was added to extract metrics from the verification records.
Batch Verification and Sortation
Batch verification processes were introduced to assist in breaking down batch-picked items to their respective orders and to ensure efficient shipping processes. The batch verification process was designed for accounts that ship products by the case without over-packing. The batch is picked and brought to a processing station. As items are identified, they are matched to the underlying orders, and labels and paperwork are produced. Once the items are exhausted, the batch should show complete. Any shortage will identify the order(s) that were shorted.
The sortation process is used when over-packing is required. As items are scanned, the logic matches each item to an order and identifies a bin for each item. Once all the items for an order are scanned, all packing paperwork is produced, and that bin can be sent down the line for final verification and closed out.
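A minimal sketch of the bin-matching rule (hypothetical order and bin data; the real process also drives labels, paperwork, and close-out):

```python
# Items still needed per order on the batch, and the bin pool.
needed = {"ORDER1": {"SKU1": 2, "SKU2": 1}, "ORDER2": {"SKU1": 1}}
assigned_bins: dict = {}
free_bins = ["BIN01", "BIN02", "BIN03"]

def scan_item(sku: str) -> str:
    """Match a scanned item to an order that still needs it, claiming a bin
    on the order's first scan; returns the bin to place the item in."""
    for order, lines in needed.items():
        if lines.get(sku, 0) > 0:
            lines[sku] -= 1
            if order not in assigned_bins:
                assigned_bins[order] = free_bins.pop(0)
            if not any(lines.values()):
                print(f"{order} complete - send {assigned_bins[order]} down the line")
            return assigned_bins[order]
    raise ValueError(f"{sku} does not match any open order on the batch")
```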
Introduce Zone Usage Flags
Warehouse zones are being more heavily used to match items to appropriate warehouse locations. A lot of custom logic started to be introduced to attach specific usage to zones and the zone designations varied in each implementation. The warehouse zone table was modified to add flags for usage. Processing logic was modified to check these usage flags before selecting, suggesting, or allowing a location based upon these zone designations.
Labor Planning
Labor Planning Enhancements
Process sub-steps were introduced to the labor planner. This allows for a more workflow-like implementation of labor activities and assignments. Multiple picking processes can now be defined, with processing steps defined specific to each account's operational needs.
Change Requests
Change requests were introduced to allow assignment of a task to a specific operator. This is intended as an exception, not as a technique to implement tasking. The intent of labor planning is to identify workers, skill sets, and equipment and establish a plan using the available workforce to achieve a unit of work. If this planning is done well, the system logic should provide the most efficient assignment of activities. Occasionally, an associate may have special skills or product knowledge that is not part of the plan, where specific assignment of a task can be used to ensure they are the one to perform that work. Change requests provide a mechanism for achieving this.
Load Out
Load out processing and audit logging came into its own with the implementation of labor planning. The implementations tended to be very operation specific. The custom implementations were examined, processing steps were broken out, and a core set of load out capabilities was introduced in the release.
Layer Picking
Layer picking was designed to separate the cases from the layers during order picking, taking pressure off the case pick lines. This allows your operations team to set an area of the warehouse (a zone) to which they want to direct all layer picks. The items in the pick line can “flex” each day as the order demand changes, so you can place different items with significant layer picks in the zone each day as order demand dictates.
Within the layer zone, an item may be placed in any location, or in as many locations as you wish. This “flexing” allows the operations teams to place product in the area based upon the order demands. There is a report that reviews orders based upon ship date and estimates the number of layers that will be needed. Remember, picking layers can impact stock rotation, so it is important that the commit logic supports the customer requirements while trying to maximize the number of layers picked from the layer area. A custom commitment scheme may be required to address specific business needs. Typically, if the layer pick area does not contain enough layers to fill an order, the order will still be committed from the pick line.
Manufacturing Support
Cleanup of Sequence Commodity Processing
Logic similar to that used on tallies and orders was added to allow re-use of a requisition reference number once it is over six months old.
Kitting Area Replenishment
Functionality was added to WSSL to auto-replenish the kit area based upon upcoming kit demands. By reviewing the kit data and the production schedule, WSSL can estimate any shortages that may exist in the kitting area. This logic considers “mixed parts,” in which the same part is used in more than one kit area. When the calculation determines that a replenishment is needed, a move is posted to the move queue.
Kit replenishment labels are produced for each move and tied back to the scheduled move on the move queue. These labels are handed to the replenishment operators, who perform the move much like a requisition pick is completed today. This ensures that the parts required by kitting and sequencing operations will be available as needed.
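The shortage calculation can be sketched as follows (hypothetical schedule, bill-of-material, and on-hand shapes; the real logic posts to the move queue and prints the labels):

```python
def replenishment_moves(schedule: dict, bom: dict, on_hand: dict) -> list:
    """Estimate kit-area shortages from the production schedule and build a
    move for each short part; mixed parts sum across every kit using them."""
    demand: dict = {}
    for kit, builds in schedule.items():           # e.g. {"KIT-A": 50}
        for part, per_kit in bom[kit].items():     # e.g. {"P1": 2, "P2": 4}
            demand[part] = demand.get(part, 0) + builds * per_kit
    moves = []
    for part, qty_needed in demand.items():
        short = qty_needed - on_hand.get(part, 0)
        if short > 0:
            moves.append({"part": part, "qty": short})  # posted to the move queue
    return moves
```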
Yard Management
YMS Web Enhancements
Yard management capabilities were added to the WDLS application 10 years ago. The YMS implementation has been migrating toward a web portal application since that early implementation. Guard shacks and yard trucks are often the source of information and activities, and a web portal, as opposed to an in-house network, better fits these operations. Further, the web portal implementation can provide customer access to yard activities, especially when the 3PL is managing a customer yard. The web portal implementation was brought into core WDLS in this release.
Financial Reporting, Invoicing, Accounts Receivable
Email Support Added to A/R Summary, Revenue Summary, and Revenue Distribution
An option was added that allows each of these reports to be delivered as an email attachment. When this option is taken, the contents of the report are delivered as a CSV file to the specified email address.
Suppress zero balance customer statements
A change was made to customer statement processing that suppresses statements with a zero balance due. This allows a simple process to produce statements for every customer with an open balance. In the past, a statement was produced whether the customer had a balance or not.
Support CSV extract for all types of invoicing
A new command and functionality were added that allow all forms of invoice to be extracted and delivered as CSV, XML, or TEXT files. This can be run across all accounts, for specific buildings, or for a specific account. Date ranges and summary level are also supported - total, summary, or detail. A status selection allows all, paid, due, or credits. The resulting files are emailed to the specified address.
Recurring/Memorized Invoices
A new table was created to store information for memorized general invoices. An option was added to the work with general invoices panel that allows you to memorize an invoice. When taken, it allows you to supply a description and captures special values for frequency, billing period, and invoice date. An option was added to the ARMENU to generate memorized invoices. When taken, the last run date is checked; if the days since the last invoice exceed the frequency setting, a new invoice is created using the memorized invoice as a shell. Alternatively, from the Work With Memorized Invoices panel, there is an option to generate any memorized invoice individually.
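The generation check reduces to a date comparison against the frequency setting. A sketch with hypothetical record fields:

```python
from datetime import date

def invoices_due(memorized: list, today: date) -> list:
    """Select memorized invoices whose days since the last run exceed their
    frequency; each selected shell becomes a new general invoice."""
    due = []
    for inv in memorized:
        if (today - inv["last_run"]).days > inv["frequency_days"]:
            due.append(inv)
            inv["last_run"] = today  # stamp so the next cycle skips it
    return due

# A 30-day invoice last generated 45 days ago is selected.
shells = [{"desc": "Monthly storage", "frequency_days": 30,
           "last_run": date(2024, 1, 1)}]
print([i["desc"] for i in invoices_due(shells, date(2024, 2, 15))])
```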
Post Parcel Shipping Charges as Special Charges for Assessorial Invoicing
Logic was developed that can be placed as a program to execute at the SCPOST exit point. When an order is closed, the logic checks the package file for the order. If shipping costs are recorded, it sums the amount from all packages and posts/updates a special charge for the order. The shipping charges then appear on the accessorial invoice. The special charge is based upon a task code of PRCL (parcel). The logic checks for a storer-specific entry and otherwise uses a generic setting. If a markup percentage is set on the task, the markup is applied when the special charge is posted. If a special charge for the PRCL task already exists on the order, the logic is bypassed.
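A sketch of the exit logic under assumed record shapes (the field names below are illustrative, not the actual WDLS file definitions):

```python
def post_parcel_charge(order: dict, packages: list, tasks: dict,
                       existing_charges: list):
    """Sum recorded shipping costs across the order's packages and build
    one PRCL special charge, applying the task's markup if one is set."""
    if any(c["task"] == "PRCL" for c in existing_charges):
        return None  # a PRCL charge already exists on the order: bypass
    total = sum(p.get("shipping_cost", 0.0) for p in packages)
    if not total:
        return None  # no shipping costs recorded for this order
    # Prefer the storer-specific PRCL entry; otherwise the generic setting.
    task = tasks.get((order["storer"], "PRCL"), tasks[("*GENERIC", "PRCL")])
    amount = total * (1 + task.get("markup_pct", 0.0) / 100)
    return {"order": order["id"], "task": "PRCL", "amount": round(amount, 2)}
```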
Work Order Invoicing Enhancements
Release level charging was better documented. This allows charging at different rates based upon the size of the release. Special charges were added, and they can be entered at the release level. There were format changes to give a more understandable invoice. A PDF version of the invoice was introduced. Invoice numbers are now removed from a release if an invoice is cancelled so that it can be re-invoiced. Flags were added to control the display of reference numbers, purchase order, and kit descriptions.
Base Utilities
Display IFS File Contents
Codeworks incorporated tooling for FTP-based trading partners to access the inbound and outbound queues, view the contents, and in some instances reprocess. This allowed you to use work processes like traditional EDI tooling for managing these relationships and resolving problems. With more customers using stream files and API interfaces, there was a need for similar tooling to help manage directories on the IFS. A standard work-with style panel was created that shows the contents of a directory, with options to display, show file details, re-process, and archive.
Web User - Work With Published Documents
A panel was added to the web portal that allows your users to access files, documents, and reports that you publish for them. The file must be published under a directory established for the client under the root folder of your web instance. Subdirectories can be used to isolate different types of information - BOL's, Invoices, Reports ...
Titles for File Extract Emails
WDLS has had a utility for some time that allows you to email any file from your system. Put together with a query, this gives you good functionality for providing your customers with custom or ad hoc requests. This capability is being used more and more, and as a result you can end up with an inbox jammed with entries. In this release, a title for the email was added to the process, allowing you to more easily distinguish different requests.
Support XML, HTML Extracts for Any Data File
The file extraction utility has long supported text and comma-separated values as format options. In this release, hypertext markup (HTML) and Excel-readable XML are also supported. Both allow you to include a shell that wraps around the extracted data. That shell can contain a reference to a cascading style sheet or embed styling to infuse your corporate brand into the documents. It can also be used to include a copyright notification at the bottom of the document.
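The shell mechanism can be pictured with a short sketch; the shell text, style sheet name, and footer below are placeholders for whatever branding is configured:

```python
import html

SHELL = """<html><head><link rel="stylesheet" href="corporate.css"></head>
<body><table>{rows}</table>
<footer>&copy; Your Company - All rights reserved.</footer></body></html>"""

def extract_to_html(records: list) -> str:
    """Render extracted rows inside the branded shell; the CSS reference
    and the legal footer travel with every document produced."""
    rows = "".join(
        "<tr>" + "".join(f"<td>{html.escape(str(v))}</td>" for v in r.values())
        + "</tr>"
        for r in records
    )
    return SHELL.format(rows=rows)
```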
JSON Data Parser
JavaScript Object Notation (JSON) has become a more commonly used formatting standard for documents. A utility was added to our base utility set that parses JSON data into data files.
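A sketch of the parsing idea - flattening arbitrarily nested JSON into path/value pairs suited to a record-oriented data file (the standard json module stands in for the WDLS utility):

```python
import json

def flatten(obj, prefix=""):
    """Walk parsed JSON and yield (path, value) pairs - the flat shape a
    record-oriented data file expects."""
    if isinstance(obj, dict):
        for k, v in obj.items():
            yield from flatten(v, f"{prefix}{k}.")
    elif isinstance(obj, list):
        for i, v in enumerate(obj):
            yield from flatten(v, f"{prefix}{i}.")
    else:
        yield prefix.rstrip("."), obj

doc = json.loads('{"order": {"id": "A1", "lines": [{"item": "SKU1", "qty": 2}]}}')
print(dict(flatten(doc)))
# {'order.id': 'A1', 'order.lines.0.item': 'SKU1', 'order.lines.0.qty': 2}
```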